Patent abstract:
WINDOWED STATISTICAL ANALYSIS FOR ANOMALY DETECTION IN GEOPHYSICAL DATA SETS
The present invention relates to a method for identifying geologic features from geophysical or attribute data using windowed principal component analysis (22), independent component analysis, or diffusion mapping (61). Subtle features become identifiable in partial or residual data volumes. Residual data volumes (24) are created (36) by eliminating the data captured by the most prominent principal components (14). Partial data volumes are created (35) by projecting the data (21) onto selected principal components (22, 61). Geologic features can also be identified from pattern analysis (77) or from anomaly volumes (62, 79) generated with a variable-scale data similarity matrix (73). The method is suitable for identifying physical features indicative of hydrocarbon potential.
Publication number: BR112012023687B1
Application number: R112012023687-3
Filing date: 2011-03-17
Publication date: 2020-11-03
Inventors: Krishnan Kumaran; Jingbo Wang; Stefan Hussenoeder; Dominique G. Gillard; Guy F. Medema; Robert L. Brovey; Pavel Dimitrov; Matthew S. Casey; Fred W. Schroeder
Applicant: Exxonmobil Upstream Research Company
Primary IPC classification:
Patent description:

Cross-Reference to Related Applications
[0001] This application claims the benefit of U.S. Application No. 12/775,226, filed May 6, 2010, entitled Windowed Statistical Analysis for Anomaly Detection in Geophysical Datasets. U.S. Application No. 12/775,226 is a continuation-in-part of PCT International Application No. PCT/US09/059044, filed September 30, 2009, which claims the benefit of U.S. Provisional Application 61/114,806, filed November 14, 2008, and U.S. Provisional Application 61/230,478, filed July 31, 2009, the disclosures of which are incorporated herein by reference in their entirety.
Field of the Invention
[0002] The invention relates generally to the field of geophysical prospecting and, more particularly, to a method for processing geophysical data. Specifically, the invention is a method for highlighting regions in one or more sets of geological or geophysical data, such as seismic data, that represent real geologic features, including potential hydrocarbon accumulations, without the use of prior training data, and where the desired physical features may appear in the raw data only in a subtle form, obscured by more prominent anomalies.
Background of the Invention
[0003] Seismic data sets often contain complex patterns that are subtle and manifested in multiple seismic or attribute/derivative volumes and at multiple spatial scales. Over the past few decades, geologists and geophysicists have developed a range of techniques to extract many important patterns that indicate the presence of hydrocarbons. However, most of these methods involve searching for either known or loosely defined patterns with pre-specified characteristics in one data volume, or two volumes at most. These "template-based" or "model-based" approaches often miss subtle or unexpected anomalies that do not conform to such specifications. These approaches will not be discussed further here, as they have little in common with the present invention, except that they address the same technical problem.
[0004] Most of these known methods involve a human interpreter searching for either known or loosely defined patterns with pre-specified characteristics in one data volume, or two volumes at most. These "template-based" or "model-based" approaches often miss subtle or unexpected anomalies that do not conform to such specifications. It is therefore desirable to develop statistical analysis methods that are capable of automatically highlighting anomalous regions in one or more volumes of seismic data across multiple spatial scales without prior knowledge of what they are or where they are. The present invention satisfies this need.
Summary of the Invention
[0005] In one embodiment, the invention is a method for identifying geologic features in one or more discretized sets of geophysical data or data attributes (each such data set referred to as an "original data volume") representing a subsurface region, comprising: (a) selecting a data window shape and size; (b) for each original data volume, moving the window to a plurality of locations, and forming for each window location a data window vector whose components consist of the voxel values from that window; (c) performing a statistical analysis of the data window vectors, the statistical analysis being performed jointly in the case of a plurality of original data volumes; (d) using the statistical analysis to identify outliers or anomalies in the data; and (e) using the outliers or anomalies to predict geologic features of the subsurface region. In one embodiment of the invention, the statistical analysis technique is diffusion mapping, which computes a basis that represents the data. This basis is the result of a non-linear transformation, which affords a parameter that defines a notion of scale.
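For illustration, the windowing of steps (a) and (b) can be sketched as follows; the array shapes, window size and function names below are illustrative assumptions and not part of the claimed method.

# Sketch of steps (a)-(b): turn a 3D data volume into data window vectors.
# The volume, window size, and names are illustrative assumptions.
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def window_vectors(volume, win=(3, 3, 3)):
    """Return an (N, n) array with one flattened window per window location."""
    windows = sliding_window_view(volume, win)      # exhaustive (overlapping) sampling
    return windows.reshape(-1, int(np.prod(win)))   # N rows, each of n = nx*ny*nz voxels

volume = np.random.rand(50, 60, 40)                 # stand-in for a seismic volume
X = window_vectors(volume)                          # one data window vector per location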
[0006] The geologic features that are identified using the present inventive method can then be used to predict the presence of hydrocarbon accumulations.
Brief Description of the Drawings
[0007] Due to patent law restrictions, Figures 1A-C, 2 and 6 are black-and-white reproductions of originally colored drawings. The original color drawings were filed in the U.S. counterpart application, U.S. Application No. 12/775,226. Copies of that patent application publication with the colored drawings may be obtained from the U.S. Patent and Trademark Office upon request and payment of the necessary fee.
[0008] The present invention and its advantages will be better understood by referring to the following detailed description and the attached drawings in which:
[0009] As an exemplary test application of the present inventive method, Figure 1A shows an image (2D time slice) from a 3D volume of synthetic seismic data; Figure 1B shows the residual of the original image generated by the present inventive method, defined by the first sixteen principal components, which represent 90% of the information; and Figure 1C illustrates the first sixteen principal components in the form of 30 x 30 windows;
[0010] Figure 2 is a schematic representation of the basic steps in an embodiment of the present inventive method that uses residual analysis;
[0011] Figure 3 is a flowchart showing the basic steps of applying a windowed PCA embodiment of the present invention to multiple data volumes using a single window size;
[0012] Figures 4A-B show a representation of a 2D slice of a data volume (large rectangle) and a sample of this data (small rectangle) for different pixels in a window, Figure 4A showing the sample data for the pixel (1,1) and Figure 4B showing the sample of data for the i-th pixel;
[0013] Figures 5A-B show the subdivision of the data not in the sample, for the 2D data set of Figures 4A-B, used for efficient computation of the covariance matrix;
[0014] Figure 6 is a schematic diagram showing a diffusion mapping embodiment of the present invention for two different values of the scale parameter; and
[0015] Figure 7 is a flowchart showing the basic steps in a diffusion mapping embodiment of the present invention.
[0016] The invention will be described in connection with exemplary embodiments. To the extent that the following description is specific to a particular embodiment or a particular use of the invention, it is intended to be illustrative only and should not be construed to limit the scope of the invention. On the contrary, it is intended to cover all alternatives, modifications and equivalents that may be included within the scope of the invention, as defined by the appended claims.
Detailed Description of Exemplary Embodiments
[0017] The present invention is a method for detecting anomalous patterns in multiple volumes of seismic or other geophysical data (for example, electromagnetic data) across multiple spatial scales without the use of prior training data. The inventive method is based on Windowed Statistical Analysis, which involves the following basic steps in one embodiment of the invention: 1. Extract a statistical distribution of the data within windows of user-specified size and shape. Standard statistical techniques, such as Principal Component Analysis (PCA), Independent Component Analysis (ICA) or Cluster Analysis, can be used. 2. Extract the anomalous regions of the data by (a) computing the probability of occurrence (or an equivalent metric) of each data window under the extracted distribution and (b) identifying the low-probability data regions as possible anomalies.
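A minimal sketch of these two basic steps follows, using a multivariate Gaussian model of the window vectors; the Gaussian assumption and the percentile threshold are illustrative choices, not requirements of the method.

# Sketch of the two basic steps: fit a distribution to the data window vectors,
# then flag the lowest-probability windows as candidate anomalies.
# The Gaussian model and the 2% threshold are illustrative assumptions.
import numpy as np

def flag_anomalies(X, pct=2.0):
    mu = X.mean(axis=0)
    W = np.cov(X, rowvar=False)                     # covariance of the window vectors
    d = X - mu
    # Squared Mahalanobis distance is a monotone stand-in for -log probability
    # under the Gaussian model, so large values mean low probability.
    m2 = np.einsum('ij,ij->i', d, np.linalg.solve(W, d.T).T)
    return m2 > np.percentile(m2, 100.0 - pct)      # True = candidate anomaly window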
[0018] Extracting a statistical distribution is not a necessary step in the present invention. Anomalies or outliers can be identified from the statistical analysis either directly or by techniques other than computing a statistical distribution; for example, some embodiments of the invention use diffusion mapping. See Figures 6 and 7.
[0019] A particularly convenient embodiment of the invention involves a combination of Windowed Principal Component Analysis ("WPCA"), Residual Analysis, and Cluster Analysis, which will be described in detail below. However, anyone of ordinary skill in the art will readily appreciate how other statistical analysis techniques may be used or suitably adapted to achieve the same goals.
[0020] A useful generalization of Principal Component Analysis ("PCA") is a method known as Independent Component Analysis ("ICA"), which is preferable when the data differ strongly from the standard multidimensional Gaussian distribution. In this case, the present inventive method uses, in a correspondingly generalized fashion, Windowed ICA ("WICA"), followed by a generalization of Residual Analysis called Outlier Detection. In one embodiment, the present invention uses PCA on moving windows, followed by computation of inner products and residual data from the Principal Components ("PCs"), which is believed to be advantageously applicable not only to seismic applications, but across the broader field of multi-dimensional data processing. This includes the fields of image, speech and signal processing.
[0021] Principal Component Analysis ("PCA") is a well-known classical technique for data analysis, first proposed by Pearson ("On Lines and Planes of Closest Fit to Systems of Points in Space", Philos. Magazine v. 2, pp. 559-572 (1901)) and further developed by Hotelling ("Analysis of a Complex of Statistical Variables Into Principal Components", Journal of Educational Psychology v. 24, pp. 417-441 (1933)). The first known application of principal component analysis to seismic data is believed to have occurred in the form of the Karhunen-Loeve transform, named after Kari Karhunen and Michel Loeve (Watanabe, "Karhunen-Loeve Expansion and Factor Analysis", Transactions of the Fourth Prague Conference, J. Kozesnik, ed., Prague, Czechoslovakia Academy of Science (1967)). This method uses PCA to describe the information content in a set of seismic traces, with the input data set being entire seismic traces, not multidimensional windows of variable size. Watanabe's primary application was to decompose entire seismic traces and use the first principal component traces to reconstruct the most coherent energy, thereby filtering out non-geologic noise.
[0022] PCA is most commonly used in seismic analysis to reduce the number of measurement characteristics to a statistically independent set of attributes (see, for example, Fournier & Derain, "A Statistical Methodology for Deriving Reservoir Properties from Seismic Data", Geophysics v. 60, pp. 1437-1450 (1995); and Hagen, "The Application of Principal Components Analysis to Seismic Data Sets", Geoexploration v. 20, pp. 93-111 (1982)). The seismic interpretation process often generates numerous products derived from the original data. Since these attributes correlate to varying degrees, PCA is an elegant way to reduce the number of attributes while retaining a large amount of the information.
[0023] To date, it is believed that no statistical outlier-detection techniques based on a moving window have been dedicated to finding geologic features of interest on a delineation and reconnaissance basis in geological and geophysical data. However, such techniques have been applied to specific subsets or domains of seismic data for signal processing or specialized reservoir characterization applications. Key and Smithson ("New Approach to Seismic Reflection Event Detection and Velocity Determination", Geophysics v. 55, pp. 1057-1069 (1990)) apply PCA to 2D moving windows of pre-stack seismic data, and the resulting eigenvalue ratio is used as a measure of signal coherence. No use is made of the principal components themselves to detect features in the pre-stack seismic data. Sheevel and Payrazyan ("Principal Component Analysis Applied to 3D Seismic Data for Reservoir Property Estimation", Society of Petroleum Engineers Annual Conference and Exhibition (1999)) calculate trace-based principal components using small, moving 1D vertical windows, and feed those PCs that look most geological into a classification algorithm that predicts reservoir properties away from well calibration. Again, this single-data-set 1D approach makes no attempt to automatically identify anomalies or outliers in the data. Cho and Spencer ("Estimation of Polarization and Slowness in Mixed Wavefields", Geophysics v. 57, pp. 805-814 (1992)) and Richwalski et al. ("Practical Aspects of Wavefield Separation of Two-Component Surface Seismic Data Based on Polarization and Slowness Estimates", Geophysical Prospecting v. 48, pp. 697-722 (2000)) use windowed 2D PCA in the frequency domain to model the propagation of a pre-defined number of P- and S-waves.
[0024] The objective of Wu et al. ("Establishing Spatial Pattern Correlations Between Water Saturation Time-Lapse and Seismic Amplitude Time-Lapse", Petroleum Society's 6th Annual Canadian International Petroleum Conference (56th Annual Technical Meeting) (2005)) is to optimally correlate single or time-lapse seismic volumes with flow simulation data in a reservoir model in order to estimate actual time-lapse saturation values from the spatial patterns. Their approach consists of making point-by-point comparisons, not on the original data volumes, but on the projection of these data onto the first principal eigenvector of the PCA analysis. Thus, their aim is to correlate seismic data with a known model rather than to identify anomalous patterns in the seismic data.
[0025] Bishop's U.S. Patent 5,848,379 ("Method for Characterizing Subsurface Petrophysical Properties Using Linear Shape Attributes", (1998)) describes a method for predicting subsurface rock properties and classifying seismic data for facies or texture analysis, not for identifying geologic features of interest on a delineation and reconnaissance basis, which is the technical problem addressed by the present invention. Bishop performs statistical analysis using PCA to decompose seismic traces into a linear combination of orthogonal waveform bases called Linear Shapes within a pre-specified time or depth interval. A Linear Shape Attribute (LSA) is defined as the subset of weights (or eigenvalues) used to reconstruct a particular trace shape. Also, Bishop does not describe the use of overlapping windows, the simultaneous analysis of multiple data volumes, or the use of a statistical distribution to detect anomalous data regions.
[0026] Other approaches for statistically analyzing geological and geophysical data have used methods such as Artificial Neural Networks, Genetic Algorithms and multi-point statistics, but not for the purpose of automatic detection of anomalous patterns. In addition, these methods have generally had limited success, since their internal workings are often obscure, and they often require, and are highly dependent on, large amounts of training data.
[0027] As previously noted, PCA and ICA are methods that are commonly used to separate high-dimensional (i.e., multi-variable or multi-attribute) signals into statistically uncorrelated (i.e., independent) components. The windowed PCA and ICA of the present invention apply component analysis to a data set that is derived from the original data by representing each point in the original data (in some embodiments of the invention) as a collection of points in its neighborhood (i.e., window). To illustrate this concept with reference to the flowchart of Figure 3, the implementation of WPCA on a single three-dimensional data volume using a fixed window size is outlined below. The same procedure, or its ICA equivalent, can be applied to 2D data, or simultaneously to multiple 2D or 3D data volumes (see step 31 in Figure 3). A 3D seismic volume of size Nx x Ny x Nz is considered:
[0028] (Step 32) Select a window shape (for example, ellipsoid or cuboid) and window size (for example, radius r, or nx x ny x nz).
[0029] Each voxel in the 3D seismic data volume, Ii,j,k, is represented as an nx x ny x nz dimensional vector that contains the voxel values within the window centered on that voxel. Although exhaustive sampling is used in the embodiment described, it is not required for the present invention. Other sampling strategies, such as potentially random or side-by-side (tiled) sampling, can be used instead.
[0030] (Step 33) Compute the mean and covariance matrix of all N of the n-dimensional vectors (n = nx x ny x nz; N = (Nx - nx) x (Ny - ny) x (Nz - nz) of them) as follows: mean(t) = (1/N) * sum_m I_m(t), and W(t,k) = (1/N) * sum_m (I_m(t) - mean(t)) * (I_m(k) - mean(k)), where I_m denotes the m-th data window vector and the sums run over m = 1, ..., N.
[0031] Compute the correlation matrix as C(t,k) = W(t,k) / sqrt(W(t,t) * W(k,k)), where t and k are two indices of the vector I and thus represent two sets of spatial coordinates in three dimensions.
[0032] (Step 34) Compute the eigenvalues (Principal Values) and eigenvectors (Principal Components) of the correlation matrix. Alternatively, the eigenvalues of the covariance matrix can be computed; they will differ from the eigenvalues of the correlation matrix only by a scaling factor. These eigenvectors will be nx x ny x nz in size, and when reshaped from their vector form back to window form, they represent the various (independent) spatial patterns in the data, ordered from most common to least common. The corresponding eigenvalues represent how much of the original data (that is, how much of the variance) each eigenvector accounts for.
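Steps 32-34 amount to an eigen-decomposition of the correlation (or covariance) matrix of the data window vectors. A minimal sketch, operating on the window vectors X from the earlier sketch, is given below; the function and variable names are illustrative assumptions.

# Sketch of steps 32-34: mean, covariance and correlation of the window
# vectors, followed by their eigenvalues (Principal Values) and eigenvectors
# (Principal Components), ordered from most to least common pattern.
import numpy as np

def windowed_pca(X):
    mu = X.mean(axis=0)                              # step 33: mean
    W = np.cov(X, rowvar=False)                      # step 33: covariance matrix
    s = np.sqrt(np.diag(W))
    C = W / np.outer(s, s)                           # correlation matrix
    evals, evecs = np.linalg.eigh(C)                 # step 34: eigen-decomposition
    order = np.argsort(evals)[::-1]                  # most common pattern first
    # Each column of evecs can be reshaped back to the window shape to view
    # the spatial pattern it represents.
    return mu, W, evals[order], evecs[:, order]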
[0033] Generate one or more of the following partial volumes of seismic or attribute data, which are then examined for anomalies that may not have been apparent in the original data volume: (a) (Step 35) Projection: The portion of the original data that can be recreated using each Principal Component or group of Principal Components (chosen, for example, by cluster analysis). This is achieved by taking the inner product of the mean-centered and normalized seismic volume with each Principal Component or group of Principal Components. Thus, the projection of a vector A onto a vector B means proj_B(A) = (A . B / B . B) B, which is a vector in the direction of B. (b) (Step 36) Residual: The signal remaining in the original volume that is not captured by the first k - 1 (i.e., most common) Principal Components. In a preferred embodiment of the invention, this is achieved by projecting the mean-centered and normalized seismic volume onto the subspace spanned by the remaining eigenvectors {v_k, ..., v_n}, where k is chosen so that (lambda_1 + ... + lambda_(k-1)) / (lambda_1 + ... + lambda_n) >= R, where R is a user-defined threshold between 0 and 1.
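A minimal sketch of steps 35-36, under the eigenvalue-fraction reading of the threshold R described above, follows; the names are illustrative.

# Sketch of steps 35-36: the projection of the data onto the top Principal
# Components and the residual they do not capture.  k is chosen so that the
# retained eigenvalues account for a fraction R of the total variance.
import numpy as np

def projection_and_residual(X, mu, evals, evecs, R=0.9):
    frac = np.cumsum(evals) / np.sum(evals)
    k = int(np.searchsorted(frac, R)) + 1            # number of retained components
    V = evecs[:, :k]                                 # top-k Principal Components
    Xc = X - mu
    proj = Xc @ V @ V.T                              # step 35: projection data
    resid = Xc - proj                                # step 36: residual data
    return proj, resid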
[0034] Alternatively, one can add up the projections onto the remaining (lower-ranked) components, although this can be computationally more laborious in most cases. (c) Outlier: The residual analysis of item (b) is the way in which the "degree of anomaly" of each voxel is determined in one embodiment of the invention. The attribute data volumes of (a) and (b) are not needed in an alternative way of computing the "degree of anomaly" of each voxel, which will be denoted R' (since it is related to, but not equal to, the residual R defined above) and is given by the following formula: R'(i,j,k) = I'(i,j,k) * W^-1 * I'(i,j,k)^T, where I'(i,j,k) is the mean-centered (and normalized) data window vector at voxel (i,j,k) and W^-1 is the inverse of the covariance (or correlation) matrix.
[0035] Using this measure of the degree of anomaly, a partial data volume is developed.
[0036] This measure also picks out "outliers" that lie in the space spanned by the first few eigenvectors; however, it can be more computationally intensive than the two steps above in some cases. It can be noted, though, that in this case step 34 above can be skipped, or simply replaced by a Cholesky decomposition of the correlation matrix, which allows a faster evaluation of R'.
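Under the quadratic-form reading of R' given above, the Cholesky shortcut mentioned in the text looks roughly as follows; the exact normalization and the names are assumptions.

# Sketch of the outlier measure R': a quadratic form in the inverse covariance
# (or correlation) matrix, evaluated through a Cholesky factorization instead
# of the full eigen-decomposition of step 34.
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def outlier_measure(X, mu, W):
    d = X - mu                                       # mean-centered window vectors
    c_and_lower = cho_factor(W)                      # Cholesky decomposition of W
    r_prime = np.einsum('ij,ij->i', d, cho_solve(c_and_lower, d.T).T)
    return r_prime                                   # one "degree of anomaly" per window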
[0037] There are variants of the basic approach above that employ different data normalization schemes. The method can be extended to an arbitrary number of seismic volumes. The adjustable parameters that the user can experiment with are (1) the window shape, (2) the window size, and (3) the threshold R for the residual projection.
[0038] The result of applying a 30 x 30 WPCA to a two-dimensional slice of seismic data is shown in Figures 1A-C. Figure 1A shows an image (2D time slice) from a volume of synthetic 3D seismic data. In actual practice, this display would typically be in color, with the colors indicating seismic reflection amplitudes (for example, blue = positive, red = negative). Figure 1B shows the residual of the original image after the first sixteen principal components, which represent 90% of the information, have been removed. The residual has high values at anomalous patterns, which in this case are faults. In a colored version of Figure 1B, blue would indicate low residual values and warmer colors would highlight the anomalous fault system that can now be clearly seen in the residual display of Figure 1B. In Figure 1C, the top (i.e., first) sixteen principal components 14 are shown in their 30 x 30 window form. The faults can be seen to be captured in several of the principal components in the bottom two rows.
[0039] The result of applying a 9x9 WPCA to a two-dimensional synthetic seismic cross-section is shown in the schematic flowchart of Figure 2. At 21, a 2D cross-section from a volume of synthetic 3D seismic data is displayed. Colors would typically be used to represent seismic reflection amplitude. A small 8-ms anticline, too subtle to detect by eye, is embedded in the otherwise horizontal background reflectivity. The first through fourth principal components (eigenvectors) of the input image are displayed at 22. Display 23 shows the projection of the original image onto the first four eigenvectors, which represent 99% of the information. Display 24 shows the residual after the projected image is subtracted from the original. The subtle embedded feature is now revealed at a depth (two-way travel time) of about 440 ms between trace numbers (which measure lateral position in one dimension) 30-50. On a color display, "warm" colors would reveal the location of the embedded subtle feature.
[0040] The flowchart of Figure 3 outlines an embodiment of the present inventive method in which WPCA is applied to multiple data volumes using a single window size.
[0041] Generalizations and Efficiencies in the Construction of Canonical Patterns
[0042] The following sections describe improvements to the windowed principal component analysis of the present invention that allow wider applicability through reduced computation, and better use of the results through interpretation of the Principal or Independent Components and their selective retention or removal.
[0043] Computational Efficiency: The straightforward method of computing the covariance matrix described above is computationally laborious for large data sets, in both memory and processor requirements. An alternative method is therefore described here that exploits the fact that the individual vectors of the PCA are windows moving across the data. For example, consider a 1-D data set with values a_1, ..., a_N. To evaluate the covariance matrix of windows of size K (there are N - K + 1 of them), the mean (first moment) and second moment of the window entries can be computed as follows: mean(t) = (1/(N - K + 1)) * sum_i a_(i+t), and S(t,k) = (1/(N - K + 1)) * sum_i a_(i+t) * a_(i+k), where the sums run over the N - K + 1 window positions i and t, k = 1, ..., K index the entries within the window; the covariance then follows as W(t,k) = S(t,k) - mean(t) * mean(k).
[0044] It can be noted that this method involves only taking means and inner products of data subvectors (submatrices in higher dimensions), and therefore avoids storing and manipulating the numerous smaller windows derived from the original data. This modification of the computational method thus allows object-oriented software with efficient array indexing (such as Matlab), together with the use of Summed-Area Tables (a data structure described by Crow in "Summed-Area Tables for Texture Mapping", Computer Graphics 18, 207 (1984)), to compute the covariance matrices with minimal storage and computational effort.
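For the 1-D example above, these moments can be accumulated directly from shifted subvectors of the data, without ever materializing the individual windows; a minimal sketch, whose indexing convention is an assumption consistent with the formulas above, follows.

# Sketch of the window-free moment computation for a 1-D data set a[0..N-1]
# and window size K: only means and inner products of shifted data subvectors
# are used, so the N-K+1 individual windows are never formed explicitly.
import numpy as np

def windowed_moments_1d(a, K):
    a = np.asarray(a, dtype=float)
    L = a.size - K + 1                                      # number of windows
    mean = np.array([a[t:t + L].mean() for t in range(K)])  # first moments
    second = np.array([[a[t:t + L] @ a[k:k + L] / L         # second moments
                        for k in range(K)] for t in range(K)])
    cov = second - np.outer(mean, mean)                     # covariance of window entries
    return mean, cov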
[0045] Alternatively, computational efficiency can be achieved by representing the computation of the covariance matrix as a series of cross-correlation operations in progressively smaller regions. To illustrate the approach, a two-dimensional data set is considered, as shown in Figures 4A-B of size n = nx * ny and a two-dimensional window of size m = mx * my.
[0046] The correlation matrix W(t,k) can then be obtained by first computing the mean of each data sample, then computing an inner product matrix, and finally normalizing the matrix and subtracting the means.
[0047] First, the means can be computed by convolving the data volume with a kernel the size of a data sample (for example, DS1) whose entries are all equal to 1/(number of pixels in DS1). This operation produces a large output matrix; the means, however, are the values located in a window of size m in the upper-left corner of that output. In general, this type of operation will be denoted corrW(kernel, data), and its result is a window of size m as above. Performing the operation using a Fast Fourier Transform (FFT) takes time proportional to n*log(n) and is independent of the size of the sampling window. This FFT approach is faster than the explicit approach when m is sufficiently larger than log(n).
[0048] Second, an inner product matrix U(t,k) is computed by performing a series of corrW operations on subsamples of the data sets. It can be noted that row i of this matrix, denoted U(i,:), can be computed as U(i,:) = corrW(DSi, data). Therefore, populating the matrix in this way takes time proportional to m*n*log(n), or better. However, it is more advantageous to compute U(t,k) by performing several corrW operations on various subregions of the data. In particular, one can rewrite corrW(DSi, data) = corrW(data, data) - corrW(data, DNSi), where corrW(data, DNSi) denotes the cross-correlation of DNSi with the data in the vicinity of DNSi, that is, within mx or my of the DNSi location. The corrW(data, data) operation needs to be performed only once for all rows, and then corrW(data, DNSi) needs to be computed multiple times. The advantage arises from the fact that DNSi is typically much smaller than the size of the data set, so corrW(data, DNSi) is a cross-correlation over a much smaller input than corrW(data, data). Similarly, the computation of corrW(data, DNSi) can be broken down into several corrW operations on even smaller sub-regions.
[0049] Large parts of DNSi are the same for different samples, differing over only one dimension of the sampling window at a time. For example, consider the illustration in Figures 5A-B. The regions in Figure 5A denoted by A, B and C together form the entire area of the data volume that is not sampled by pixel 1. This area can be further subdivided to perform fewer calculations. Consider the "vertical" area spanned by A and C and compare it to the DSi of a different sampling region, as shown in Figure 5B. The analogous vertical area is spanned by the union of several smaller regions: C1 + C2 + C3 + C4 + A1 + A2. (The equivalent division of region B in Figure 5A is the union of B1 + B2 in Figure 5B.) In general, there are only mx distinct possible such areas, each corresponding to a unique lateral DSi location. In other words, the data contained in A + C will be the same for many different DSi data samples, so it needs to be handled only mx times - a savings of a factor of my in the calculations over that area. Therefore, the calculation of corrW(data, DNSi) can be optimized in this way and computed as corrW(data, DNSi) = corrW(data, A + C) + corrW(data, B + C) - corrW(data, C), where the regions denoted by a letter signify the union of all regions labeled with that letter and a number; for example, C in the equation refers to region C in Figure 5A and to C1 + C2 + C3 + C4 in Figure 5B, so A + C is represented by A1 + A2 + C1 + C2 + C3 + C4 in Figure 5B. Since the computation of corrW(data, A + C) needs to be performed only once for every my rows of U(t,k), and similarly for corrW(data, B + C), the only part that needs to be computed for each row is corrW(data, C). The efficiency gains arise from the fact that the region denoted by C is typically significantly smaller than the other regions. Proceeding in this way, the algorithm extends to 3-D data sets and windows (and, in fact, to any dimension).
[0050] Finally, the correlation matrix W(t,k) is obtained by properly normalizing the U matrix and subtracting the means: W(t,k) = U(t,k)/nDS - mean(DSt) * mean(DSk), where nDS is the number of elements in each data sample.
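Assuming corrW is an FFT-based cross-correlation whose m-sized window of results is kept, the assembly of W(t,k) for a 2-D data set can be sketched as below; the A/B/C sub-region optimization described above is omitted for clarity, and the names are illustrative.

# Sketch of building W(t,k) = U(t,k)/nDS - mean(DS_t)*mean(DS_k) for a 2-D
# data set, using FFT-based cross-correlations as the corrW primitive.
import numpy as np
from scipy.signal import correlate

def correlation_matrix_2d(data, mx, my):
    nx, ny = data.shape
    px, py = nx - mx + 1, ny - my + 1                 # size of each data sample DS_i
    nDS = px * py
    samples = [data[i:i + px, j:j + py]               # DS_i for window pixel (i, j)
               for i in range(mx) for j in range(my)]
    means = np.array([ds.mean() for ds in samples])
    # Row i of the inner-product matrix U is corrW(DS_i, data): an mx-by-my
    # window of inner products, computed here with an FFT-based correlation.
    U = np.array([correlate(data, ds, mode='valid', method='fft').ravel()
                  for ds in samples])
    return U / nDS - np.outer(means, means)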
[0051] Use of Masks: For very large data sets, even the computational efficiencies described above may not be enough for the available computational resources to yield results in a timely manner. In such cases, one can apply (a) the inner product calculation with the eigenvectors or (b) the Principal Component calculation only within a predefined mask. A mask is a spatial subset of the data on which the calculations are performed. The mask can be generated (a) interactively by the user or (b) automatically using derived attributes. An example of (b) would be the pre-selection of data regions that have high local gradients, using gradient estimation algorithms. The inner product computation is more laborious than the Principal Component calculation, which motivates applying a mask to one or both calculations as needed.
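A minimal sketch of an automatically derived mask of type (b), keeping only locations with a high local gradient magnitude, is given below; the percentile threshold is an illustrative assumption.

# Sketch of an automatically generated mask: keep only the locations whose
# local gradient magnitude is high, and restrict later calculations to them.
import numpy as np

def gradient_mask(volume, pct=75.0):
    grads = np.gradient(volume)                       # one gradient array per axis
    mag = np.sqrt(sum(g ** 2 for g in grads))         # local gradient magnitude
    return mag >= np.percentile(mag, pct)             # boolean mask of data to keep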
[0052] Applications of Canonical Patterns
[0053] In addition, the computed Principal/Independent Components can be grouped into sets that represent similar patterns as measured by texture, chaos or other characteristics. Together with the residual volume, the projection of the original seismic data onto individual Principal Components, or onto groups of them, will generate a large number of derived seismic volumes with enhanced anomalous patterns. These embodiments of the present inventive method are described in more detail below.
[0054] Multiple Windows / Spatial Scales: In addition, it is possible to optimize the effort of computing covariance matrices for multiple window sizes nested in a hierarchical order, compared with the direct way of computing them all at once. Again, consider the one-dimensional example with two window sizes K1 < K2. The mean moment and the second moment for K2 are computed first using the method above; after that, the same quantities for K1 can be computed as follows:
[0055] Note that the formulas above allow the quantities for the smaller window to be computed with little additional effort. It is straightforward to extend this method to a nested series of larger windows.
[0056] Use of Principal Components and Projections: There are many possible ways in which the Principal Components and the projections generated by the present inventive method can be used, combined and visualized. A preferred implementation involves identifying anomalies using the residual, as described above. An equally valid approach is to perform selective projections of the original data onto a chosen subset of PCs. The subset can be chosen (a) interactively by the user or (b) automatically using computational metrics on the PCs. An example of (b) would be the selection of PCs that have features resembling "channels" or tubular structures, using an automatic geometric algorithm. Another example would be reducing the noise in the input data by creating a projection that excludes "noisy" PCs, using a noise detection algorithm or dispersion metric. Those working in the art will recognize other examples from this description.
[0057] Other useful ways of visualizing the projection results at various window sizes include viewing (a) user-selected or automatically selected combinations of PC projections, (b) residuals at various residual thresholds, or (c) noise components. Another useful variant is the visualization of a "classification volume", which involves color-coding each data location with a color that uniquely identifies which PC projection has the highest value at that location.
[0058] Iterative WPCA: The residual volume created by the workflow outlined in Figure 3 has been found to show large values in areas that contain more anomalous patterns. As a consequence, more subtle patterns in the input data are often masked by more obvious anomalies in the residual volume. To increase the sensitivity of the WPCA to extremely subtle patterns, two alternative iterative approaches can be used:
[0059] Iterative eigenvector removal: This first alternative procedure may include the following steps: 1. Perform the first four steps of the flowchart in Figure 3 (through the eigenvector and eigenvalue generation). 2. Identify those eigenvectors whose projections reconstruct a large amount of the background signal and most obvious anomalies. 3. Project the data only on the subset of eigenvectors that were not identified in the previous step (the background signal and that of the most obvious anomalies must be attenuated in this projected image). 4. Perform WPCA on the projected image generated in the previous step. 5. Repeat steps 1 - 3 as needed.
[0060] Iterative Masking Data Removal: This second alternative procedure can include the following steps: 1. Perform the first four steps of Figure 3 (through the eigenvector and eigenvalue generation). 2. By examining the various residual volumes, identify those areas in the input data that correspond to the most obvious anomalies. 3. Perform the WPCA on the input data, excluding the areas identified, either by a. setting all attribute values in those areas to zero before the WPCA analysis, or b. not including those areas as input to the WPCA. 4. Perform the WPCA on the new data set. 5. Repeat steps 1 - 3 as needed.
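A minimal sketch of the first of these two procedures (iterative eigenvector removal) is given below; which eigenvectors to remove is assumed to be supplied by the interpreter, and the helper reuses the earlier windowed_pca sketch.

# Sketch of iterative eigenvector removal: project the data away from the
# eigenvectors that reconstruct the background and the most obvious anomalies,
# then re-run the windowed PCA on the projected data.  The list of eigenvectors
# to drop is assumed to come from interactive inspection.
import numpy as np

def remove_and_reproject(X, mu, evecs, drop):
    keep = [i for i in range(evecs.shape[1]) if i not in set(drop)]
    V = evecs[:, keep]                               # retained eigenvectors only
    return (X - mu) @ V @ V.T + mu                   # background-suppressed data

# X_new = remove_and_reproject(X, mu, evecs, drop=[0, 1])
# mu2, W2, evals2, evecs2 = windowed_pca(X_new)      # re-run WPCA on projected data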
[0061] WPCA Classification: The Principal Components can be used to classify the image based on the strength of the projections. Such classification helps to identify regions with specific patterns represented in the chosen Principal Components through convenient visualization, especially when the original data consist of multiple volumes. This variation can include the following steps: 1. Perform steps 31-34 of Figure 3 (through the eigenvector and eigenvalue generation). 2. Assign to each point in the data a number corresponding to the eigenvector that reconstructs the largest part of the signal in the window around that point. This constitutes a classified volume in which each point contains a number between 1 (that is, the first eigenvector) and n = nx x ny x nz (that is, the last eigenvector). 3. The classification results are then displayed by assigning each value (or group of values) from 1 to n a unique color or transparency (or a combination of these). This procedure is a form of classification commonly applied to two-dimensional images. By producing categories based on the magnitude of the signal in the projected images, instead of a continuous spectrum of residual or projection values, this procedure suffers less from a lack of sensitivity to subtle features.
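A minimal sketch of step 2 of this classification, assigning each window location the (1-based) index of the eigenvector whose projection is largest in magnitude there, is given below; the names are illustrative.

# Sketch of the WPCA classification volume: label each window location with
# the index of the eigenvector whose projection has the largest magnitude.
import numpy as np

def classification_volume(X, mu, evecs, out_shape):
    scores = np.abs((X - mu) @ evecs)                # |projection| onto each eigenvector
    labels = scores.argmax(axis=1) + 1               # 1 = first eigenvector, ...
    return labels.reshape(out_shape)                 # classified volume for display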
[0062] Thus, the present inventive method is advantageous for extracting features from large, high-dimensional data sets such as seismic data. Most published methods for applying PCA to, for example, seismic data are similar to the present inventive method only in that they perform the decomposition on data windows. An example is the method of Wu et al. mentioned above. Their approach differs from the present invention in several fundamental ways. First, they apply only small, vertically moving 1D windows to the seismic data as input to the PCA; 3D moving windows are used only on the flow simulation data. Second, only the first PC is used to reconstruct both the time-lapse seismic data and the flow simulation data; no other projection or mathematical combination, such as the construction of a residual volume, is performed. Finally, no attempt is made to examine multiple seismic volumes simultaneously, let alone to extract patterns intrinsic to the seismic data (that is, not tied to a pre-existing geologic model).
[0063] Seismic Data Diffusion Mapping
[0064] One approach to extracting geologically meaningful patterns from seismic data is to compute an appropriate representation of the data in some linear space, typically the result of Principal Component Analysis (PCA), through which the data are transformed into linear combinations of basis elements obtained by the method. Some patterns of geologic interest, however, violate several of the assumptions that PCA imposes: patterns of equal importance may appear at different scales, their distribution is not necessarily Gaussian, and the manifold that contains them in the data may not be linear. Here, a method is outlined that addresses all of these concerns while preserving the benefits of PCA. The approach is based on the so-called Diffusion Map of R. R. Coifman et al. ("Geometric diffusions as a tool for harmonic analysis and structure definition of data: Diffusion maps", Proceedings of the National Academy of Sciences 102(21), 7426-7431 (2005)), which is incorporated herein by reference in all jurisdictions that allow it. As in the case of PCA, a basis is computed (61 in Figure 6) that represents the data. Unlike PCA, this basis is the result of a non-linear transformation that admits a parameter (epsilon) defining a notion of scale. Thus, the nonlinearities in the data are captured in a controlled manner. Interestingly, the scale parameter can be adjusted to produce results similar to those of PCA, and the normalization used here was shown by A. Singer to be connected to Independent Component Analysis (ICA) ("Spectral independent component analysis", Applied and Computational Harmonic Analysis 21, 135-144 (2006)).
[0065] Steps to carry out Seismic Data Diffusion Mapping
[0066] In one embodiment, diffusion mapping can be performed with the following basic steps, with reference to the flowchart of Figure 7: 1. For a given volume of 2D or 3D geophysical data (71) and any chosen sampling strategy (for example, potentially random, exhaustive or side-by-side), sample the volume with a sampling scheme {sx, sy, sz}: each sample consists of the data points collected by an arbitrary but fixed-size 3-D window (for example, a cube with sx = sy = sz) that moves from one location to another according to the sampling strategy (step 72). 2. Collect the samples, that is, the data window vectors, in a data array A(m,n), where m = 1, ..., M, M is the number of data voxels per sample (for example, M = sx x sy x sz for a rectangular window), n = 1, ..., N, and N is the number of samples, so that M << N. 3. In step 73, compute a symmetric M x M similarity matrix L, for example L(i,j) = exp(-||a_i - a_j||^2 / epsilon), where ||...|| denotes a selected norm. Here a_i and a_j are row vectors of length N from the data array A(m,n), with i, j <= M, and epsilon is a predefined scale factor. 4. In step 74, compute a diagonal normalization matrix D = Diag(D_1, ..., D_M), where D_i is the sum of the entries in row i of L. 5. In step 75, compute a diffusion matrix by normalizing the similarity matrix: M = D^-1 L. 6. In step 76, compute a symmetric diffusion matrix Msym = D^(1/2) M D^(-1/2).
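A minimal sketch of steps 2-6 follows; the Gaussian kernel is the standard diffusion-map choice and is used here as an assumption consistent with the description above, and the names are illustrative.

# Sketch of steps 2-6: build the similarity, normalization, diffusion and
# symmetric diffusion matrices from the M x N data array A (M voxels per
# sample, N samples).  The Gaussian kernel form of L is an assumption.
import numpy as np

def diffusion_matrices(A, eps):
    diff = A[:, None, :] - A[None, :, :]             # pairwise differences of rows a_i
    L = np.exp(-np.sum(diff ** 2, axis=-1) / eps)    # similarity matrix (step 73)
    D = L.sum(axis=1)                                # diagonal normalization (step 74)
    M = L / D[:, None]                               # diffusion matrix M = D^-1 L (75)
    Msym = np.sqrt(D)[:, None] * M / np.sqrt(D)[None, :]  # D^(1/2) M D^(-1/2) (step 76)
    return L, D, M, Msym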
[0067] Uses of Seismic Diffusion Mapping
[0068] The symmetric normalized similarity matrix can be used for data analysis in the following ways: 1. Pattern Analysis: a. In step 77 of Figure 7, decompose Msym into its eigenvalues and eigenvectors via eig(Msym); the eigenvectors with nonzero eigenvalues represent the scale-dependent (epsilon) bases for pattern analysis. In some cases, subsets of the eigenvectors define a complete pattern of interest. 2. Anomaly Detection (Anomaly Attribute): a. In step 78, using the same window as when M was computed, collect samples at all possible locations in the data volume. b. Create an anomaly volume initialized to zero everywhere. c. In step 79, for each analogous location n in the anomaly volume, set the value by projecting the corresponding data sample onto the inverse of the symmetric diffusion matrix. d. Create at least one more anomaly volume using a different value of the scale parameter (epsilon), and observe the scale-dependent differences in the anomalies; Figure 6 illustrates two anomaly volumes created at different scales (62).
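A minimal sketch of the anomaly-attribute step follows; scoring each sample with a quadratic form against the inverse of Msym (analogous to the R' measure above) is an assumption, since the text specifies only that the samples are projected onto the inverse of the symmetric diffusion matrix.

# Sketch of the diffusion-map anomaly attribute: score every window sample
# against the inverse of the symmetric diffusion matrix Msym.  The quadratic-
# form scoring used here is an assumption (analogous to the R' measure above).
import numpy as np

def diffusion_anomaly_volume(samples, Msym, out_shape):
    """samples: (num_locations, M) window vectors; Msym: (M, M)."""
    Minv_s = np.linalg.solve(Msym, samples.T).T      # Msym^-1 applied to each sample
    scores = np.einsum('ij,ij->i', samples, Minv_s)  # one anomaly value per location
    return scores.reshape(out_shape)                 # scale-dependent anomaly volume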
[0069] The foregoing description is directed to particular embodiments of the present invention for the purpose of illustrating them. However, it will be apparent to one skilled in the art that many modifications and variations of the embodiments described herein are possible. All such modifications and variations are intended to be within the scope of the present invention, as defined in the appended claims. Persons skilled in the art will understand that, for practical applications, at least some of the steps of the present inventive method must be performed using a computer programmed in accordance with the teachings herein.
Claims:
Claims (10)
[0001]
1. A computer-implemented method to identify geologic features in one or more discretized sets of geophysical data or data attributes representing a subsurface region, each such data set referred to as an "original data volume" (71), the method characterized by the fact that it comprises the following, at least one of which is performed using a computer: (a) selecting a data window shape and size; (b) for each original data volume, moving the window to a plurality of locations, and forming for each window location a data window vector whose components consist of voxel values from that window; (c) performing a statistical analysis of the data window vectors, the statistical analysis being performed jointly in the case of a plurality of original data volumes, in which the statistical analysis is performed using diffusion mapping, where the diffusion mapping technique comprises using a non-linear transformation to represent the data window vectors with a basis set, where the non-linear transformation involves a parameter that defines a notion of scale and where, when there are N data window locations and, therefore, N data window vectors, each with M components, that is, M voxels of data at each data window location, the diffusion mapping technique comprises: collecting the N data window vectors in a data array A(m,n), where m = 1, ..., M and n = 1, ..., N; computing an M x M similarity matrix L (73), in which L(i,j) is a measure of the difference between a_i and a_j, where a_i and a_j are row vectors of length N from the data array A(m,n), with i, j <= M, and in which L(i,j) involves a scale parameter selected by the user; forming a diagonal matrix D (74) from the similarity matrix, where D_i is the sum of the entries in row i of L; computing a diffusion matrix by normalizing the similarity matrix (75): M = D^-1 L; computing a symmetric diffusion matrix Msym = D^(1/2) M D^(-1/2) (76); and either (i) using Msym for pattern analysis by decomposing Msym into its eigenvalues and eigenvectors via eig(Msym), the eigenvectors with nonzero eigenvalues representing the scale-dependent bases for pattern analysis; or (ii) using Msym for anomaly detection by, using the selected data window size and shape, collecting all possible samples from the original data volume (78), projecting all the data samples onto the inverse of the symmetric diffusion matrix, thereby creating a scale-dependent anomaly attribute volume (79), and identifying the outliers or anomalies in the scale-dependent anomaly attribute volume; and (d) using the outliers, anomalies or patterns to predict geologic features of the subsurface region.
[0002]
2. Method, according to claim 1, characterized by the fact that it additionally comprises using the predicted geologic features of the subsurface region to infer petroleum potential or the lack thereof.
[0003]
3. Method for producing hydrocarbons from a subsurface region, characterized by the fact that it comprises: (a) obtaining data from a geophysical survey of the subsurface region; (b) obtaining a prediction of the petroleum potential of the subsurface region based at least in part on physical features of the region identified using a method according to claim 1, which is incorporated herein by reference, applied to the geophysical survey data; (c) in response to a positive prediction of petroleum potential, drilling a well into the subsurface region and producing hydrocarbons.
[0004]
4. Method, according to claim 1, characterized by the fact that L(i,j) = exp(-||a_i - a_j||^2 / epsilon), where epsilon is the scale parameter selected by the user and ||...|| denotes a selected norm.
[0005]
5. Method, according to claim 4, characterized by the fact that it additionally comprises performing the statistical analysis for at least one additional choice of epsilon.
[0006]
6. Method, according to claim 1, characterized by the fact that M << N.
[0007]
7. Method, according to claim 1, characterized by the fact that using statistical analysis to identify outliers or anomalies in the data comprises computing eigenvectors and eigenvalues of the symmetric diffusion matrix, and using them in the pattern analysis.
[0008]
8. Method, according to claim 1, characterized by the fact that the plurality of locations is determined by a sampling strategy selected from a group consisting of potentially random, exhaustive and side by side.
[0009]
9. Method, according to claim 1, characterized by the fact that: the data window is moved to overlapping positions; each voxel of data in an original data volume is included in at least one window; and a distribution of data values is computed from the statistical analysis and is used to identify outliers or anomalies in the data.
[0010]
10. Method, according to claim 9, characterized by the fact that identifying outliers or anomalies in the data comprises (i) computing a probability of occurrence, or an equivalent metric, of each data window under the distribution of data values; and (ii) identifying the low-probability data regions as outliers or possible anomalies.
Patent family:
Publication number | Publication date
CN102884448A|2013-01-16|
AU2011248992B2|2014-09-25|
CA2793504A1|2011-11-10|
WO2011139416A1|2011-11-10|
EP2567261A1|2013-03-13|
US8380435B2|2013-02-19|
MY164498A|2017-12-29|
AU2011248992A1|2012-11-15|
CA2793504C|2017-02-07|
EP2567261A4|2017-05-10|
NZ603314A|2013-10-25|
RU2554895C2|2015-06-27|
EP2567261B1|2021-06-30|
RU2012152447A|2014-06-20|
JP2013527926A|2013-07-04|
US20110272161A1|2011-11-10|
CN102884448B|2015-07-22|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title

US4916615A|1986-07-14|1990-04-10|Conoco Inc.|Method for stratigraphic correlation and reflection character analysis of seismic signals|
US5047991A|1989-04-28|1991-09-10|Schlumberger Technology Corporation|Lithology identification using sonic data|
US5274714A|1990-06-04|1993-12-28|Neuristics, Inc.|Method and apparatus for determining and organizing feature vectors for neural network recognition|
FR2738920B1|1995-09-19|1997-11-14|Elf Aquitaine|METHOD FOR AUTOMATIC SEISMIC FACIAL RECOGNITION|
CA2220274C|1996-04-12|2005-06-28|Adam Gersztenkorn|Method and apparatus for seismic signal processing and exploration|
US5995448A|1996-11-20|1999-11-30|Krehbiel; Steven|Method for mapping seismic reflective data|
US6466923B1|1997-05-12|2002-10-15|Chroma Graphics, Inc.|Method and apparatus for biomathematical pattern recognition|
US5848379A|1997-07-11|1998-12-08|Exxon Production Research Company|Method for characterizing subsurface petrophysical properties using linear shape attributes|
US5940778A|1997-07-31|1999-08-17|Bp Amoco Corporation|Method of seismic attribute generation and seismic exploration|
GB9904101D0|1998-06-09|1999-04-14|Geco As|Subsurface structure identification method|
GB9819910D0|1998-09-11|1998-11-04|Norske Stats Oljeselskap|Method of seismic signal processing|
US6751354B2|1999-03-11|2004-06-15|Fuji Xerox Co., Ltd|Methods and apparatuses for video segmentation, classification, and retrieval using image class statistical models|
US6430507B1|1999-04-02|2002-08-06|Conoco Inc.|Method for integrating gravity and magnetic inversion with geopressure prediction for oil, gas and mineral exploration and production|
DE19943325C2|1999-09-10|2001-12-13|Trappe Henning|Process for processing seismic measurement data with a neural network|
US6295504B1|1999-10-25|2001-09-25|Halliburton Energy Services, Inc.|Multi-resolution graph-based clustering|
US6226596B1|1999-10-27|2001-05-01|Marathon Oil Company|Method for analyzing and classifying three dimensional seismic information|
US6574566B2|1999-12-27|2003-06-03|Conocophillips Company|Automated feature identification in data displays|
MY125603A|2000-02-25|2006-08-30|Shell Int Research|Processing seismic data|
US6363327B1|2000-05-02|2002-03-26|Chroma Graphics, Inc.|Method and apparatus for extracting selected feature information and classifying heterogeneous regions of N-dimensional spatial data|
GC0000235A|2000-08-09|2006-03-29|Shell Int Research|Processing an image|
US6560540B2|2000-09-29|2003-05-06|Exxonmobil Upstream Research Company|Method for mapping seismic attributes using neural networks|
US6950786B1|2000-10-10|2005-09-27|Schlumberger Technology Corporation|Method and apparatus for generating a cross plot in attribute space from a plurality of attribute data sets and generating a class data set from the cross plot|
US7006085B1|2000-10-30|2006-02-28|Magic Earth, Inc.|System and method for analyzing and imaging three-dimensional volume data sets|
US6597994B2|2000-12-22|2003-07-22|Conoco Inc.|Seismic processing system and method to determine the edges of seismic data events|
US20020169735A1|2001-03-07|2002-11-14|David Kil|Automatic mapping from data to preprocessing algorithms|
US7069149B2|2001-12-14|2006-06-27|Chevron U.S.A. Inc.|Process for interpreting faults from a fault-enhanced 3-dimensional seismic attribute volume|
US6766252B2|2002-01-24|2004-07-20|Halliburton Energy Services, Inc.|High resolution dispersion estimation in acoustic well logging|
US20050288863A1|2002-07-12|2005-12-29|Chroma Energy, Inc.|Method and system for utilizing string-length ratio in seismic analysis|
US20060184488A1|2002-07-12|2006-08-17|Chroma Energy, Inc.|Method and system for trace aligned and trace non-aligned pattern statistical calculation in seismic analysis|
US7184991B1|2002-07-12|2007-02-27|Chroma Energy, Inc.|Pattern recognition applied to oil exploration and production|
US7308139B2|2002-07-12|2007-12-11|Chroma Energy, Inc.|Method, system, and apparatus for color representation of seismic data and associated measurements|
US7162463B1|2002-07-12|2007-01-09|Chroma Energy, Inc.|Pattern recognition template construction applied to oil exploration and production|
US7295706B2|2002-07-12|2007-11-13|Chroma Group, Inc.|Pattern recognition applied to graphic imaging|
US7188092B2|2002-07-12|2007-03-06|Chroma Energy, Inc.|Pattern recognition template application applied to oil exploration and production|
US6807486B2|2002-09-27|2004-10-19|Weatherford/Lamb|Method of using underbalanced well data for seismic attribute analysis|
US6868341B2|2002-12-23|2005-03-15|Schlumberger Technology Corporation|Methods and apparatus for processing acoustic waveforms received in a borehole|
US7298376B2|2003-07-28|2007-11-20|Landmark Graphics Corporation|System and method for real-time co-rendering of multiple attributes|
US20050171700A1|2004-01-30|2005-08-04|Chroma Energy, Inc.|Device and system for calculating 3D seismic classification features and process for geoprospecting material seams|
EP1707993B1|2005-03-29|2009-08-19|Total S.A.|Method and computer program for the determination of geological discontinuities|
US7379386B2|2006-07-12|2008-05-27|Westerngeco L.L.C.|Workflow for processing streamer seismic data|
US8014880B2|2006-09-29|2011-09-06|Fisher-Rosemount Systems, Inc.|On-line multivariate analysis in a distributed process control system|
MY159169A|2008-11-14|2016-12-30|Exxonmobil Upstream Res Co|Windowed statistical analysis for anomaly detection in geophysical datasets|
US8380435B2|2010-05-06|2013-02-19|Exxonmobil Upstream Research Company|Windowed statistical analysis for anomaly detection in geophysical datasets|
EP1956554B1|2007-02-09|2009-10-07|Agfa-Gevaert|Visual enhancement of interval changes using a temporal subtraction technique|
US8380435B2|2010-05-06|2013-02-19|Exxonmobil Upstream Research Company|Windowed statistical analysis for anomaly detection in geophysical datasets|
US10310119B2|2011-06-24|2019-06-04|Ion Geophysical Corporation|Method and apparatus for seismic noise reduction|
US9798027B2|2011-11-29|2017-10-24|Exxonmobil Upstream Research Company|Method for quantitative definition of direct hydrocarbon indicators|
WO2013106720A1|2012-01-12|2013-07-18|Schlumberger Canada Limited|Method for constrained history matching coupled with optimization|
EP2815255B1|2012-02-13|2017-03-01|Exxonmobil Upstream Research Company|System and method for detection and classification of seismic terminations|
CA2867170C|2012-05-23|2017-02-14|Exxonmobil Upstream Research Company|Method for analysis of relevance and interdependencies in geoscience data|
US9261615B2|2012-06-15|2016-02-16|Exxonmobil Upstream Research Company|Seismic anomaly detection using double-windowed statistical analysis|
CN102879823B|2012-09-28|2015-07-22|电子科技大学|Method for fusing seismic attributes on basis of fast independent component analysis|
US20140129149A1|2012-11-02|2014-05-08|Schlumberger Technology Corporation|Formation Evaluation Using Hybrid Well Log Datasets|
US10422900B2|2012-11-02|2019-09-24|Exxonmobil Upstream Research Company|Analyzing seismic data|
US9529115B2|2012-12-20|2016-12-27|Exxonmobil Upstream Research Company|Geophysical modeling of subsurface volumes based on horizon extraction|
WO2014099200A1|2012-12-20|2014-06-26|Exxonmobil Upstream Research Company|Vector based geophysical modeling of subsurface volumes|
WO2014099202A1|2012-12-20|2014-06-26|Exxonmobil Upstream Research Company|Method and system for geophysical modeling of subsurface volumes based on label propagation|
WO2014099204A1|2012-12-20|2014-06-26|Exxonmobil Upstream Research Company|Method and system for geophysical modeling of subsurface volumes based on computed vectors|
JP6013178B2|2012-12-28|2016-10-25|株式会社東芝|Image processing apparatus and image processing method|
US9829591B1|2013-01-07|2017-11-28|IHS Global, Inc.|Determining seismic stratigraphic features using a symmetry attribute|
US20140269186A1|2013-03-14|2014-09-18|Chevron U.S.A. Inc.|System and method for isolating signal in seismic data|
WO2014149344A1|2013-03-15|2014-09-25|Exxonmobil Upstream Research Company|Method and system for geophysical modeling of subsurface volumes|
WO2014150580A1|2013-03-15|2014-09-25|Exxonmobil Upstream Research Company|Method for geophysical modeling of subsurface volumes|
US9333497B2|2013-03-29|2016-05-10|Exxonmobil Research And Engineering Company|Mitigation of plugging in hydroprocessing reactors|
US20140358440A1|2013-05-31|2014-12-04|Chevron U.S.A. Inc.|System and Method For Characterizing Geological Systems Using Statistical Methodologies|
US9824135B2|2013-06-06|2017-11-21|Exxonmobil Upstream Research Company|Method for decomposing complex objects into simpler components|
CN104424393B|2013-09-11|2017-10-20|中国石油化工股份有限公司|A kind of geological data reservoir reflectance signature based on principal component analysis strengthens method|
US10663609B2|2013-09-30|2020-05-26|Saudi Arabian Oil Company|Combining multiple geophysical attributes using extended quantization|
US9990568B2|2013-11-29|2018-06-05|Ge Aviation Systems Limited|Method of construction of anomaly models from abnormal data|
US20150308191A1|2014-04-29|2015-10-29|Sinopec Tech Houston, LLC.|System and method for monitoring drilling systems|
US10359523B2|2014-08-05|2019-07-23|Exxonmobil Upstream Research Company|Exploration and extraction method and system for hydrocarbons|
WO2016067254A1|2014-10-30|2016-05-06|Koninklijke Philips N.V.|Texture analysis map for image data|
US11163092B2|2014-12-18|2021-11-02|Exxonmobil Upstream Research Company|Scalable scheduling of parallel iterative seismic jobs|
US10267934B2|2015-01-13|2019-04-23|Chevron U.S.A. Inc.|System and method for generating a depositional sequence volume from seismic data|
WO2016118223A1|2015-01-22|2016-07-28|Exxonmobil Upstream Research Company|Adaptive structure-oriented operator|
WO2016171778A1|2015-04-24|2016-10-27|Exxonmobil Upstream Research Company|Seismic stratigraphic surface classification|
KR101656862B1|2016-03-15|2016-09-13|한국지질자원연구원|Apparatus and method for performing stochastic modeling of earthquake fault rupture|
US11205103B2|2016-12-09|2021-12-21|The Research Foundation for the State University|Semisupervised autoencoder for sentiment analysis|
US10394883B2|2016-12-29|2019-08-27|Agrian, Inc.|Classification technique for multi-band raster data for sorting and processing of colorized data for display|
CN106934208B|2017-01-05|2019-07-23|国家能源局大坝安全监察中心|A kind of dam thundering observed data automatic identifying method|
US10157319B2|2017-02-22|2018-12-18|Sas Institute Inc.|Monitoring, detection, and surveillance system using principal component analysis with machine and sensor data|
CN107764697A|2017-10-13|2018-03-06|中国石油化工股份有限公司|Gas potential detection method based on the progressive equation non-linear inversion of pore media|
US11112516B2|2018-04-30|2021-09-07|Schlumberger Technology Corporation|Data fusion technique to compute reservoir quality and completion quality by combining various log measurements|
US11187820B1|2019-06-18|2021-11-30|Euram Geo-Focus Technologies Corporation|Methods of oil and gas exploration using digital imaging|
US11249220B2|2019-08-14|2022-02-15|Chevron U.S.A. Inc.|Correlation matrix for simultaneously correlating multiple wells|
US11231407B2|2019-09-23|2022-01-25|Halliburton Energy Services, Inc.|System and method for graphene-structure detection downhole|
US11187826B2|2019-12-06|2021-11-30|Chevron U.S.A. Inc.|Characterization of subsurface regions using moving-window based analysis of unsegmented continuous data|
US11263362B2|2020-01-16|2022-03-01|Chevron U.S.A. Inc.|Correlation of multiple wells using subsurface representation|
Legal status:
2019-01-08| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]|
2019-11-05| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2020-06-16| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2020-11-03| B16A| Patent or certificate of addition of invention granted|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 17/03/2011, SUBJECT TO THE APPLICABLE LEGAL CONDITIONS. |
Priority:
Application number | Filing date | Patent title
US12/775,226|2010-05-06|
US12/775,226|US8380435B2|2010-05-06|2010-05-06|Windowed statistical analysis for anomaly detection in geophysical datasets|
PCT/US2011/028851|WO2011139416A1|2010-05-06|2011-03-17|Windowed statistical analysis for anomaly detection in geophysical datasets|